707 results for Performance improvements

in Queensland University of Technology - ePrints Archive


Relevance:

100.00%

Abstract:

The purpose of this article is to examine the role of the alignment between technological innovation effectiveness and operational effectiveness after the implementation of enterprise information systems, and the impact of this alignment on the improvement in operational performance. Confirmatory factor analysis was used to examine structural relationships between the set of observed variables and the set of continuous latent variables. The findings from this research suggest that the dimensions stemming from technological innovation effectiveness, such as system quality, information quality, service quality and user satisfaction, and the performance objectives stemming from operational effectiveness, such as cost, quality, reliability, flexibility and speed, are important and significantly correlated factors. These factors promote the alignment between technological innovation effectiveness and operational effectiveness and should be the focus for managers in achieving effective implementation of technological innovations. In addition, there is a significant and direct influence of this alignment on the improvement of operational performance. The principal limitation of this study is that the findings are based on a small sample size.
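
As a rough illustration of the kind of measurement model behind such an analysis, the sketch below specifies the two constructs and their covariance and fits them to survey-style data. It is only a minimal sketch: the item names, the generated data and the choice of the semopy package are assumptions for illustration, not details taken from the article.

```python
import numpy as np
import pandas as pd
from semopy import Model

# Hypothetical survey data: two correlated latent factors drive the observed items.
rng = np.random.default_rng(0)
n = 200
tie = rng.normal(size=n)                      # technological innovation effectiveness
oe = 0.6 * tie + 0.8 * rng.normal(size=n)     # operational effectiveness, correlated with TIE
items = {
    "system_quality": tie, "information_quality": tie,
    "service_quality": tie, "user_satisfaction": tie,
    "cost": oe, "quality": oe, "reliability": oe, "flexibility": oe, "speed": oe,
}
data = pd.DataFrame({k: v + 0.5 * rng.normal(size=n) for k, v in items.items()})

# Measurement model for the two constructs plus their covariance (the 'alignment' link).
desc = """
TIE =~ system_quality + information_quality + service_quality + user_satisfaction
OE  =~ cost + quality + reliability + flexibility + speed
TIE ~~ OE
"""
model = Model(desc)
model.fit(data)
print(model.inspect())   # factor loadings and the TIE-OE covariance, with p-values
```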

Relevance:

70.00%

Abstract:

A fundamental principle of the resource-based view (RBV) of the firm is that the basis for a competitive advantage lies primarily in the application of bundles of valuable strategic capabilities and resources at a firm’s or supply chain’s disposal. These capabilities enact research activities and outputs produced by industry-funded R&D bodies. Such industry-led innovations are seen as strategic industry resources, because effective utilization of industry innovation capacity by sectors such as the Australian beef industry is critical if productivity levels are to increase. Academics and practitioners often maintain that dynamic supply chains and innovation capacity are the mechanisms most likely to deliver performance improvements in national industries. Yet many industries are still failing to capitalise on these strategic resources. In this research, we draw on the RBV and embryonic research into strategic supply chain capabilities. We investigate how two strategic supply chain capabilities (supply chain performance differential capability and supply chain dynamic capability) influence industry-led innovation capacity utilization and provide superior performance enhancements to the supply chain. In addition, we examine the influence of the size of the supply chain operative as a control variable. Results indicate that both small and large supply chain operatives in this industry believe these strategic capabilities influence, and function as second-order latent variables of, this strategic supply chain resource. Additionally, respondents acknowledge that size impacts both the amount of influence these strategic capabilities have and the level of performance enhancement supply chain operatives expect from utilizing industry-led innovation capacity. Results, however, also indicate contradictions within this industry and with the existing literature when it comes to utilizing such e-resources.

Relevance:

70.00%

Abstract:

Better management of knowledge assets has the potential to improve business processes and increase productivity. This fact has led to considerable interest in recent years in the knowledge management (KM) phenomenon, and in the main dimensions that can impact on its application in construction. However, a lack of a systematic way of assessing KM initiatives’ contribution towards achieving organisational business objectives is evident. This paper describes the first stage of a research project intended to develop, and empirically test, a KM input-process-output framework comprising unique and well-defined theoretical constructs representing the KM process and its internal and external determinants in the context of construction. The paper presents the underlying principles used in operationally defining each construct through the use of extant KM literature. The KM process itself is explicitly modelled via a number of clearly articulated phases that ultimately lead to knowledge utilisation and capitalisation, which in turn adds value or otherwise to meeting defined business objectives. The main objective of the model is to reduce the impact of subjectivity in assessing the contribution made by KM practices and initiatives toward achieving performance improvements.

Relevance:

70.00%

Abstract:

Increasing global competition, rapid technological change, advances in manufacturing and information technology, and discerning customers are forcing supply chains to adopt improvement practices that enable them to deliver high-quality products at a lower cost and in a shorter period of time. A lean initiative is one of the most effective approaches toward achieving this goal. In the lean improvement process, it is critical to measure current and desired performance levels in order to clearly evaluate the lean implementation efforts. Many attempts have been made to measure supply chain performance incorporating both quantitative and qualitative measures, but they have failed to provide an effective method of measuring improvements in performance for dynamic lean supply chain situations. Therefore, appropriate measurement of lean supply chain performance has become imperative. There are many lean tools available for supply chains; however, the effectiveness of a lean tool depends on the type of product and supply chain. One tool may be highly effective for a supply chain involved in high-volume products but may not be effective for low-volume products. There is currently no systematic methodology available for selecting appropriate lean strategies based on the type of supply chain and market strategy. This thesis develops an effective method to measure the performance of a supply chain using both quantitative and qualitative metrics, and investigates the effects of product types and lean tool selection on supply chain performance. Supply chain performance metrics, and the effects of various lean tools on the performance metrics defined in the SCOR framework, have been investigated. A lean supply chain model based on the SCOR metric framework is then developed in which non-lean and lean, as well as quantitative and qualitative, metrics are incorporated. The values of the metrics are converted into triangular fuzzy numbers using similarity rules and heuristic methods. Data have been collected from an apparel manufacturing company for multiple supply chain products, and a fuzzy-based method is then applied to measure the performance improvements in the supply chains. Using the fuzzy TOPSIS method, which chooses an optimum alternative by maximising similarity to the positive ideal solution and minimising similarity to the negative ideal solution, the performance of lean and non-lean supply chain situations for three different apparel products has been evaluated. To address the research questions related to an effective performance evaluation method and the effects of lean tools on different types of supply chains, a conceptual framework and two hypotheses are investigated. Empirical results show that implementation of lean tools has significant effects on performance improvements in terms of time, quality and flexibility. The fuzzy TOPSIS-based method developed is able to integrate multiple supply chain metrics into a single performance measure, while the lean supply chain model incorporates qualitative and quantitative metrics. It can therefore effectively measure the improvements in a supply chain after implementing lean tools. It is demonstrated that the product types involved in the supply chain and the ability to select the right lean tools have a significant effect on lean supply chain performance. Future studies could conduct multiple case studies in different contexts.
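
The following is a minimal sketch of the fuzzy TOPSIS ranking step described above, using triangular fuzzy numbers and the vertex distance. The alternatives, criteria, ratings and equal weights are invented for illustration and do not come from the thesis data.

```python
import numpy as np

# Each alternative is rated on each criterion as a triangular fuzzy number (l, m, u).
# Hypothetical data: 3 supply chain situations x 3 benefit criteria, equal crisp weights.
ratings = np.array([
    # time         quality       flexibility
    [[5, 7, 9],   [3, 5, 7],   [5, 7, 9]],    # non-lean baseline
    [[7, 9, 10],  [5, 7, 9],   [7, 9, 10]],   # lean, product A
    [[3, 5, 7],   [7, 9, 10],  [5, 7, 9]],    # lean, product B
], dtype=float)
weights = np.array([1 / 3, 1 / 3, 1 / 3])

# Normalise benefit criteria by the largest upper bound per criterion, then weight.
u_max = ratings[:, :, 2].max(axis=0)
weighted = (ratings / u_max[None, :, None]) * weights[None, :, None]

# Fuzzy positive/negative ideal solutions, taken component-wise per criterion.
fpis = weighted.max(axis=0)
fnis = weighted.min(axis=0)

def fuzzy_distance(a, b):
    """Vertex distance between triangular fuzzy numbers."""
    return np.sqrt(((a - b) ** 2).mean(axis=-1))

d_pos = fuzzy_distance(weighted, fpis[None]).sum(axis=1)   # distance to FPIS
d_neg = fuzzy_distance(weighted, fnis[None]).sum(axis=1)   # distance to FNIS
closeness = d_neg / (d_pos + d_neg)                        # higher = closer to ideal
print(closeness)   # ranking of the three supply chain situations
```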

Relevance:

70.00%

Abstract:

Australia’s building stock includes many older commercial buildings with numerous factors that impact energy performance and indoor environment quality. The built environment industry has generally focused heavily on improving physical building design elements for greater energy efficiency (such as retrofits and environmental upgrades); however, there are noticeable ‘upper limits’ to performance improvements in these areas. To achieve a step-change improvement in building performance, the authors propose that additional components need to be addressed in a whole-of-building approach, including the way building design elements are managed and the level of stakeholder engagement between owners, tenants and building managers. This paper focuses on the opportunities provided by this whole-of-building approach, presenting the findings of a research project undertaken through the Sustainable Built Environment National Research Centre (SBEnrc) in Australia. Researchers worked with a number of industry partners over two years to investigate the issues facing stakeholders at base building and tenancy levels, and the barriers to improving building performance. Through a mixed-method, industry-led research approach, five ‘nodes’ were identified in whole-of-building performance evaluation, each with interlinking and overlapping complexities that can influence performance. The nodes cover building management, occupant experience, indoor environment quality, agreements and culture, and design elements. This paper outlines the development and testing of these nodes and their interactions, and the resultant multi-nodal tool, called the ‘Performance Nexus’ tool. The tool is intended to be of most benefit in evaluating opportunities for performance improvement in the vast stock of existing low-performing buildings.

Relevance:

70.00%

Abstract:

This study examined the effect of exercise intensity and duration during 5-day heat acclimation (HA) on cycling performance and neuromuscular responses. Twenty recreationally trained males completed a ‘baseline’ trial, followed by 5 consecutive days of HA and a ‘post-acclimation’ trial. Baseline and post-acclimation trials consisted of maximal voluntary contractions (MVC), a single and repeated countermovement jump protocol, a 20 km cycling time trial (TT) and 5 x 6 s maximal sprints (SPR). Cycling trials were undertaken in 33.0 ± 0.8 °C and 60 ± 3% relative humidity. Core (Tcore) and skin (Tskin) temperatures, heart rate (HR), rating of perceived exertion (RPE) and thermal sensation were recorded throughout the cycling trials. Participants were assigned to either a 30 min high-intensity (30HI) or a 90 min low-intensity (90LI) cohort for HA, conducted in environmental conditions of 32.0 ± 1.6 °C. The percentage change in time to complete the 20 km TT for the 90LI cohort was significantly improved post-acclimation (-5.9 ± 7.0%; P=0.04) compared to the 30HI cohort (-0.18 ± 3.9%; P<0.05). The 30HI cohort showed the greatest improvements in power output (PO) during post-acclimation SPR1 and SPR2 compared to 90LI (546 ± 128 W and 517 ± 87 W, respectively; P<0.02). No differences were evident for MVC within the 30HI cohort; however, a reduced performance, indicated by percentage change, was evident within the 90LI cohort (P=0.04). Compared to baseline, mean Tcore was reduced post-acclimation within the 30HI cohort (P=0.05), while mean Tcore and HR were significantly reduced within the 90LI cohort (P=0.01 and 0.04, respectively). Greater physiological adaptations and performance improvements were noted within the 90LI cohort compared to the 30HI cohort. However, 30HI did provide some benefit to anaerobic performance, including sprint PO and MVC. These findings suggest that specifying training duration and intensity during heat acclimation may be useful for targeting specific post-acclimation performance outcomes.

Relevance:

60.00%

Abstract:

Hybrid face recognition, using image (2D) and structural (3D) information, has explored the fusion of Nearest Neighbour classifiers. This paper examines the effectiveness of feature modelling for each individual modality, 2D and 3D. Furthermore, it is demonstrated that the fusion of feature modelling techniques for the 2D and 3D modalities yields performance improvements over the individual classifiers. By fusing the feature modelling classifiers for each modality with equal weights, the average Equal Error Rate improves from 12.60% for the 2D classifier and 12.10% for the 3D classifier to 7.38% for the hybrid 2D+3D classifier.
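
A minimal sketch of equal-weight score-level fusion and Equal Error Rate computation is given below. The score distributions are synthetic and the use of scikit-learn's ROC utilities is an assumption; only the fuse-then-compute-EER procedure reflects the paper.

```python
import numpy as np
from sklearn.metrics import roc_curve

def eer(genuine, impostor):
    """Equal Error Rate from genuine and impostor match scores."""
    labels = np.concatenate([np.ones_like(genuine), np.zeros_like(impostor)])
    scores = np.concatenate([genuine, impostor])
    fpr, tpr, _ = roc_curve(labels, scores)
    fnr = 1 - tpr
    idx = np.nanargmin(np.abs(fpr - fnr))      # operating point where FPR ~= FNR
    return (fpr[idx] + fnr[idx]) / 2

# Hypothetical per-modality match scores (higher = more likely the same person).
rng = np.random.default_rng(0)
gen_2d, imp_2d = rng.normal(1.0, 1, 500), rng.normal(0, 1, 5000)
gen_3d, imp_3d = rng.normal(1.1, 1, 500), rng.normal(0, 1, 5000)

# Equal-weight score-level fusion of the 2D and 3D classifiers.
gen_fused = 0.5 * gen_2d + 0.5 * gen_3d
imp_fused = 0.5 * imp_2d + 0.5 * imp_3d

for name, g, i in [("2D", gen_2d, imp_2d), ("3D", gen_3d, imp_3d), ("2D+3D", gen_fused, imp_fused)]:
    print(f"{name}: EER = {100 * eer(g, i):.2f}%")
```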

Relevance:

60.00%

Abstract:

This paper proposes a new method of using foreground silhouette images for human pose estimation. Labels are introduced to the silhouette images, providing an extra layer of information that can be used in the model fitting process. The pixels in the silhouettes are labelled according to the corresponding body part in the model of the current fit, with the labels propagated into the silhouette of the next frame to be used in the fitting for that frame. Both single- and multi-view implementations are detailed, with results showing performance improvements over using only standard unlabelled silhouettes.
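
One plausible way to realise the label propagation step is sketched below: each foreground pixel of the next frame's silhouette inherits the label of the nearest body-part pixel rendered from the previous fit. This nearest-neighbour rule and the SciPy-based implementation are assumptions for illustration, not the paper's exact scheme.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def propagate_labels(prev_labels, next_silhouette):
    """
    prev_labels: 2D int array, 0 = background, >0 = body-part label rendered
                 from the model fitted to the previous frame.
    next_silhouette: 2D bool array, foreground mask for the next frame.
    Each foreground pixel of the next frame takes the label of the nearest
    labelled pixel from the previous frame.
    """
    # Indices of the nearest non-zero (labelled) pixel for every location.
    _, (iy, ix) = distance_transform_edt(prev_labels == 0, return_indices=True)
    nearest = prev_labels[iy, ix]
    out = np.zeros_like(prev_labels)
    out[next_silhouette] = nearest[next_silhouette]
    return out

# Toy example: a two-part "body" that shifts one pixel to the right.
prev = np.zeros((6, 6), dtype=int)
prev[1:3, 1:3] = 1   # e.g. torso
prev[3:5, 1:3] = 2   # e.g. legs
nxt = np.zeros((6, 6), dtype=bool)
nxt[1:5, 2:4] = True
print(propagate_labels(prev, nxt))
```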

Relevance:

60.00%

Abstract:

Automatic recognition of people is an active field of research with important forensic and security applications. In these applications, it is not always possible for the subject to be in close proximity to the system. Voice represents a human behavioural trait which can be used to recognise people in such situations. Automatic Speaker Verification (ASV) is the process of verifying a person's identity through the analysis of their speech, and enables recognition of a subject at a distance over a telephone channel, wired or wireless. A significant amount of research has focussed on the application of Gaussian mixture model (GMM) techniques to speaker verification systems, providing state-of-the-art performance. GMMs are a type of generative classifier trained to model the probability distribution of the features used to represent a speaker. Recently introduced to the field of ASV research is the support vector machine (SVM). An SVM is a discriminative classifier requiring examples from both positive and negative classes to train a speaker model. The SVM is based on margin maximisation, whereby a hyperplane attempts to separate classes in a high-dimensional space. SVMs applied to the task of speaker verification have shown high potential, particularly when used to complement current GMM-based techniques in hybrid systems. This work aims to improve the performance of ASV systems using novel and innovative SVM-based techniques. Research was divided into three main themes: session variability compensation for SVMs; unsupervised model adaptation; and impostor dataset selection. The first theme investigated the differences between the GMM and SVM domains for the modelling of session variability, an aspect crucial for robust speaker verification. Techniques developed to improve the robustness of GMM-based classification were shown to bring about similar benefits to discriminative SVM classification through their integration in the hybrid GMM mean supervector SVM classifier. Further, the domains for the modelling of session variation were contrasted and found to share a number of common factors; however, the SVM domain consistently provided marginally better session variation compensation. Minimal complementary information was found between the techniques due to the similarities in how they achieved their objectives. The second theme saw the proposal of a novel model for the purpose of session variation compensation in ASV systems. Continuous progressive model adaptation attempts to improve speaker models by retraining them after exploiting all encountered test utterances during normal use of the system. The introduction of the weight-based factor analysis model provided significant performance improvements of over 60% in an unsupervised scenario. SVM-based classification was then integrated into the progressive system, providing further benefits in performance over the GMM counterpart. Analysis demonstrated that SVMs also hold several characteristics beneficial to the task of unsupervised model adaptation, prompting further research in the area. In pursuing the final theme, an innovative background dataset selection technique was developed. This technique selects the most appropriate subset of examples from a large and diverse set of candidate impostor observations for use as the SVM background, by exploiting the SVM training process. This selection was performed on a per-observation basis so as to overcome the shortcomings of the traditional heuristic-based approach to dataset selection. Results demonstrate that the approach provides performance improvements over both the use of the complete candidate dataset and the best heuristically selected dataset, whilst being only a fraction of the size. The refined dataset was also shown to generalise well to unseen corpora and to be highly applicable to the selection of impostor cohorts required in alternate techniques for speaker verification.
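
To make the GMM mean supervector SVM idea concrete, the sketch below MAP-adapts a universal background model to each utterance (means only), stacks the adapted means into a supervector, and trains a linear SVM speaker model. The feature dimensions, relevance factor and synthetic data are assumptions; the structure follows the standard supervector approach referred to in the thesis, not its exact configuration.

```python
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.svm import SVC

def map_adapted_supervector(ubm, frames, relevance=16.0):
    """Mean-only MAP adaptation of a UBM to one utterance, stacked as a supervector."""
    resp = ubm.predict_proba(frames)                 # (n_frames, n_components)
    n_k = resp.sum(axis=0) + 1e-10                   # soft counts per component
    e_k = resp.T @ frames / n_k[:, None]             # per-component data means
    alpha = (n_k / (n_k + relevance))[:, None]
    adapted = alpha * e_k + (1 - alpha) * ubm.means_
    return adapted.ravel()

# Hypothetical acoustic features: (n_frames, n_dims) arrays per utterance.
rng = np.random.default_rng(0)
background = rng.normal(size=(5000, 12))
target_utts = [rng.normal(0.3, 1, size=(300, 12)) for _ in range(5)]
impostor_utts = [rng.normal(0.0, 1, size=(300, 12)) for _ in range(50)]

ubm = GaussianMixture(n_components=8, covariance_type="diag", random_state=0).fit(background)

X = np.array([map_adapted_supervector(ubm, u) for u in target_utts + impostor_utts])
y = np.array([1] * len(target_utts) + [0] * len(impostor_utts))

svm = SVC(kernel="linear").fit(X, y)                 # one SVM speaker model
test_sv = map_adapted_supervector(ubm, rng.normal(0.3, 1, size=(300, 12)))
print(svm.decision_function([test_sv]))              # higher score = more target-like
```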

Relevance:

60.00%

Abstract:

This study assesses the recently proposed data-driven background dataset refinement technique for speaker verification using alternate SVM feature sets to the GMM supervector features for which it was originally designed. The performance improvements brought about in each trialled SVM configuration demonstrate the versatility of background dataset refinement. This work also extends the originally proposed technique to exploit support vector coefficients as an impostor suitability metric in the data-driven selection process. Using support vector coefficients improved the performance of the refined datasets in the evaluation of unseen data. Further, attempts are made to exploit the differences in impostor example suitability measures from varying feature spaces to provide added robustness.
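
A simplified, single-SVM sketch of using support vector coefficients as an impostor suitability metric is shown below: candidate impostors that become support vectors with large dual coefficients are retained as the refined background. The data and the single-model simplification are assumptions; the actual refinement aggregates this evidence across many client models.

```python
import numpy as np
from sklearn.svm import SVC

def refine_background(target_examples, candidate_impostors, keep=100):
    """
    Rank candidate impostor examples by the magnitude of their support vector
    coefficients in a target-vs-background SVM, and keep the top `keep`.
    A simplified, single-SVM variant of the data-driven refinement idea.
    """
    X = np.vstack([target_examples, candidate_impostors])
    y = np.array([1] * len(target_examples) + [0] * len(candidate_impostors))
    svm = SVC(kernel="linear", C=1.0).fit(X, y)

    # dual_coef_ holds y_i * alpha_i for support vectors only; map back to
    # candidate indices and use |alpha| as the impostor suitability metric.
    suitability = np.zeros(len(X))
    suitability[svm.support_] = np.abs(svm.dual_coef_[0])
    impostor_scores = suitability[len(target_examples):]
    ranked = np.argsort(impostor_scores)[::-1]
    return ranked[:keep]

rng = np.random.default_rng(0)
targets = rng.normal(0.5, 1, size=(20, 50))          # hypothetical supervectors
candidates = rng.normal(0.0, 1, size=(1000, 50))
refined_idx = refine_background(targets, candidates, keep=100)
refined_background = candidates[refined_idx]
print(refined_background.shape)
```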

Relevance:

60.00%

Abstract:

Facial expression is an important channel for human communication and can be applied in many real-world applications. One critical step for facial expression recognition (FER) is to accurately extract emotional features. Current approaches to FER in static images have not fully considered and utilized the features of facial element and muscle movements, which represent the static and dynamic, as well as geometric and appearance, characteristics of facial expressions. This paper proposes an approach to address this limitation using ‘salient’ distance features, which are obtained by extracting patch-based 3D Gabor features, selecting the ‘salient’ patches, and performing patch matching operations. The experimental results demonstrate a high correct recognition rate (CRR), significant performance improvements due to the consideration of facial element and muscle movements, promising results under face registration errors, and fast processing time. The comparison with the state-of-the-art performance confirms that the proposed approach achieves the highest CRR on the JAFFE database and is among the top performers on the Cohn-Kanade (CK) database.
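
As a rough sketch of patch-based Gabor feature extraction, the code below filters a face image over several scales and orientations and pools the responses inside patches centred on landmark points. It is a simplified 2D stand-in: the kernel parameters, landmark positions and pooling are assumptions, not the paper's 3D Gabor and salient-patch selection procedure.

```python
import numpy as np
import cv2

def gabor_patch_features(gray, points, patch=24, scales=(4, 8), n_orient=4):
    """
    Filter the image with a small bank of Gabor kernels, then pool the
    responses inside patches centred on the given points (e.g. landmarks).
    """
    responses = []
    for lam in scales:
        for k in range(n_orient):
            theta = k * np.pi / n_orient
            kern = cv2.getGaborKernel((31, 31), sigma=lam / 2, theta=theta,
                                      lambd=lam, gamma=0.5, psi=0)
            responses.append(cv2.filter2D(gray, cv2.CV_32F, kern))
    feats = []
    half = patch // 2
    for (y, x) in points:
        for r in responses:
            window = r[max(y - half, 0):y + half, max(x - half, 0):x + half]
            feats.extend([window.mean(), window.std()])   # simple pooling per patch
    return np.asarray(feats, dtype=np.float32)

img = np.random.default_rng(0).integers(0, 256, size=(128, 128)).astype(np.uint8)
landmarks = [(40, 45), (40, 85), (80, 64)]                # hypothetical eye/mouth points
print(gabor_patch_features(img, landmarks).shape)
```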

Relevance:

60.00%

Abstract:

Numerous tools and techniques have been developed to eliminate or reduce waste and implement Lean concepts in the manufacturing environment. However, in practice, manufacturers encounter difficulties in clearly identifying the weaknesses of their existing processes in order to address them by implementing Lean tools. Moreover, selection and implementation of appropriate Lean strategies to address the problems identified is a challenging task. To the best of the authors' knowledge, there is no method available to quantitatively evaluate the costs and benefits of implementing a Lean strategy to address the weaknesses in a manufacturing process; therefore, the benefits of Lean approaches cannot be clearly established. The authors developed a methodology to quantitatively measure the performance of a manufacturing system, detect the causes of inefficiencies, and select appropriate Lean strategies to address the problems identified. The proposed methodology demonstrates that Lean strategies should be implemented based on the context of the organization and the problems identified in order to achieve maximum cost benefits. Finally, a case study is presented to demonstrate how the procedure developed works in a practical situation.

Relevance:

60.00%

Abstract:

Developing economies accommodate more than three quarters of the world's population, which means understanding their growth and well-being is of critical importance. Information technology (IT) is one resource that has had a profound effect in shaping the global economy, and it is also an important resource for driving growth and development in developing economies. Investments in developing economies, however, have focused on the exploitation of labor and natural resources. Unlike in developed economies, focus on IT investment to improve the efficiency and effectiveness of business processes in developing economies has been sparse, and the mechanisms for deriving better IT-related business value are not well understood. This study develops a complementarities-based business value model for developing economies, and tests the relationships between IT investments, IT-related complementarities, and business process performance. It also considers the relationship between business process performance and firm-level performance. The results suggest that a coordinated investment in IT and IT-related complementarities relates favorably to business process performance, and that improvements in process-level performance lead to improvements in firm-level performance. The results also suggest that IT-related complementarities are not only a source of business value on their own, but also enhance the ability of IT resources to contribute to business process performance. This study demonstrates that a coordinated investment approach is required in developing economies: with this approach, their IT resources and IT-related complementarities would help them significantly in improving their business processes, and eventually their firm-level performance.

Relevance:

60.00%

Abstract:

Search technologies are critical to enable clinical staff to rapidly and effectively access patient information contained in free-text medical records. Medical search is challenging as terms in the query are often general but those in relevant documents are very specific, leading to granularity mismatch. In this paper we propose to tackle granularity mismatch by exploiting subsumption relationships defined in formal medical domain knowledge resources. In symbolic reasoning, a subsumption (or 'is-a') relationship is a parent-child relationship where one concept is a subset of another concept. Subsumed concepts are included in the retrieval function. In addition, we investigate a number of initial methods for combining the weights of query concepts and those of subsumed concepts. Subsumption relationships were found to provide a strong indication of relevant information; their inclusion in retrieval functions yields performance improvements. This result motivates the development of formal models of relationships between medical concepts for retrieval purposes.
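
A toy sketch of the retrieval idea is shown below: query concepts are expanded with their subsumed ('is-a') descendants, and the descendant weights are combined with the parent weight using a simple discount. The miniature hierarchy and the particular combination scheme are assumptions for illustration only.

```python
from collections import defaultdict

# Hypothetical 'is-a' hierarchy: child concept -> parent concept.
IS_A = {
    "viral pneumonia": "pneumonia",
    "bacterial pneumonia": "pneumonia",
    "pneumonia": "lung disease",
}

def subsumed(concept):
    """All concepts whose 'is-a' chain leads to `concept` (its descendants)."""
    children = defaultdict(list)
    for child, parent in IS_A.items():
        children[parent].append(child)
    out, stack = [], [concept]
    while stack:
        c = stack.pop()
        for ch in children[c]:
            out.append(ch)
            stack.append(ch)
    return out

def expanded_query_weights(query_concepts, child_weight=0.5):
    """Combine the weight of each query concept with discounted weights for
    its subsumed concepts (one simple combination scheme among many possible)."""
    weights = defaultdict(float)
    for c in query_concepts:
        weights[c] += 1.0
        for ch in subsumed(c):
            weights[ch] += child_weight
    return dict(weights)

print(expanded_query_weights(["lung disease"]))
# e.g. {'lung disease': 1.0, 'pneumonia': 0.5, 'viral pneumonia': 0.5, 'bacterial pneumonia': 0.5}
```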

Relevance:

60.00%

Abstract:

Using Monte Carlo simulation for radiotherapy dose calculation can provide more accurate results when compared to the analytical methods usually found in modern treatment planning systems, especially in regions with a high degree of inhomogeneity. These more accurate results acquired using Monte Carlo simulation, however, often require orders of magnitude more calculation time to attain high precision, thereby reducing its utility within the clinical environment. This work aims to improve the utility of Monte Carlo simulation within the clinical environment by developing techniques which enable faster Monte Carlo simulation of radiotherapy geometries. This is achieved principally through the use of new high-performance computing environments and simpler, yet equivalent, alternative representations of complex geometries. Firstly, the use of cloud computing technology and its application to radiotherapy dose calculation is demonstrated. As with other supercomputer-like environments, the time to complete a simulation decreases as 1/n with n cloud-based computers performing the calculation in parallel. Unlike traditional supercomputer infrastructure, however, there is no initial outlay of cost, only modest ongoing usage fees; the simulations described in the following are performed using this cloud computing technology. The definition of geometry within the chosen Monte Carlo simulation environment - Geometry & Tracking 4 (GEANT4) in this case - is also addressed in this work. At the simulation implementation level, a new computer aided design interface is presented for use with GEANT4, enabling direct coupling between manufactured parts and their equivalent in the simulation environment, which is of particular importance when defining linear accelerator treatment head geometry. Further, a new technique for navigating tessellated or meshed geometries is described, allowing for up to 3 orders of magnitude performance improvement with the use of tetrahedral meshes in place of complex triangular surface meshes. The technique has application in the definition of both mechanical parts in a geometry and patient geometry. Static patient CT datasets like those found in typical radiotherapy treatment plans are often very large and impose a significant performance penalty on a Monte Carlo simulation. By extracting the regions of interest in a radiotherapy treatment plan, and representing them in a mesh-based form similar to those used in computer aided design, the above-mentioned optimisation techniques can be used to reduce the time required to navigate the patient geometry in the simulation environment. Results presented in this work show that these equivalent yet much simplified patient geometry representations enable significant performance improvements over simulations that consider raw CT datasets alone. Furthermore, this mesh-based representation allows for direct manipulation of the geometry, enabling motion augmentation for time-dependent dose calculation, for example. Finally, an experimental dosimetry technique is described which allows the validation of time-dependent Monte Carlo simulations, like those made possible by the aforementioned patient geometry definition. A bespoke organic plastic scintillator dose rate meter is embedded in a gel dosimeter, thereby enabling simultaneous 3D dose distribution and dose rate measurement.
This work demonstrates the effectiveness of applying alternative and equivalent geometry definitions to complex geometries for the purposes of Monte Carlo simulation performance improvement. Additionally, these alternative geometry definitions allow for manipulations to be performed on otherwise static and rigid geometry.
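
The performance gain from tetrahedral meshes rests on a cheap containment test; the sketch below shows the barycentric point-in-tetrahedron check on a toy mesh (a unit cube split into five tetrahedra). It is a brute-force Python illustration of the idea only; the thesis implements the navigation inside GEANT4 and accelerates the lookup rather than scanning every cell.

```python
import numpy as np

def barycentric_coords(tet, p):
    """Barycentric coordinates of point p in tetrahedron tet (4 x 3 vertex array)."""
    T = np.column_stack([tet[1] - tet[0], tet[2] - tet[0], tet[3] - tet[0]])
    l123 = np.linalg.solve(T, p - tet[0])
    return np.concatenate([[1.0 - l123.sum()], l123])

def containing_tetrahedron(vertices, tets, p, eps=1e-9):
    """
    Locate the tetrahedron containing p: p lies inside a tetrahedron exactly
    when all four barycentric coordinates are non-negative. Brute-force scan;
    the containment test itself is what makes tetrahedral navigation cheap.
    """
    for i, tet in enumerate(tets):
        if (barycentric_coords(vertices[tet], p) >= -eps).all():
            return i
    return -1  # outside the mesh

# Toy mesh: a unit cube split into 5 tetrahedra.
vertices = np.array([[0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0],
                     [0, 0, 1], [1, 0, 1], [1, 1, 1], [0, 1, 1]], dtype=float)
tets = np.array([[0, 1, 2, 5], [0, 2, 3, 7], [0, 5, 7, 4], [2, 5, 6, 7], [0, 2, 5, 7]])
print(containing_tetrahedron(vertices, tets, np.array([0.2, 0.2, 0.2])))
```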